Neural networks with periodic and monotonic activation functions: a comparative study in classification problems

Authors

  • Josep M. Sopena
  • Enrique Romero
Abstract

This article discusses a number of reasons why the use of non-monotonic functions as activation functions can lead to a marked improvement in the performance of a neural network. Using a wide range of benchmarks we show that a multilayer feed-forward network using sine activation functions (and an appropriate choice of initial parameters) learns much faster than one incorporating sigmoid functions (as much as 150-500 times faster) when both types are trained with backpropagation. Learning speed also compares favorably with speeds reported using modified versions of the backpropagation algorithm. In addition, computational and generalization capacity increases.
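The comparison in the abstract can be illustrated with a minimal NumPy sketch: the same one-hidden-layer network trained with plain batch backpropagation, once with sine hidden units and once with sigmoid hidden units. This is not the authors' experimental setup; the XOR task, network size, learning rate, and initial weight range are illustrative assumptions (though the small initial weight range echoes the abstract's note that initial parameters matter for periodic activations).

```python
import numpy as np

def train_xor(activation, d_activation, hidden=8, lr=0.5, epochs=3000, seed=0):
    """Train a one-hidden-layer MLP on XOR with plain backpropagation.

    `activation` / `d_activation` are the hidden-layer transfer function
    and its derivative; the output unit uses a sigmoid in both cases.
    Returns the final mean squared error on the four XOR patterns.
    """
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    # Small-magnitude initial weights: an illustrative stand-in for the
    # "appropriate choice of initial parameters" the abstract refers to.
    W1 = rng.uniform(-0.5, 0.5, (2, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.uniform(-0.5, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        # Forward pass.
        h_pre = X @ W1 + b1
        h = activation(h_pre)
        out = sigmoid(h @ W2 + b2)
        # Backward pass (squared-error loss, sigmoid output unit).
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * d_activation(h_pre)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    return float(np.mean((out - y) ** 2))

# Sine hidden units (derivative is cosine) vs. sigmoid hidden units.
sine_loss = train_xor(np.sin, np.cos)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
sig_loss = train_xor(sig, lambda z: sig(z) * (1.0 - sig(z)))
```

Swapping the hidden-layer nonlinearity is the only difference between the two runs, so any gap in final error or in epochs-to-convergence is attributable to the activation function (and its interaction with the initial weight range), which is the comparison the paper makes at much larger scale.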


Similar articles

Estimating Posterior Probabilities in Classification Problems with Neural Networks

Classification problems are used to determine the group membership of multi-dimensional objects and are prevalent in every organization and discipline. Central to the classification determination is the posterior probability. This paper introduces the theory and applications of the classification problem, and of neural network classifiers. Through controlled experiments with problems of kno...


Handwritten Digit Recognition Using Multi-Layer Feedforward Neural Networks with Periodic and Monotonic Activation Functions

The problem of handwritten digit recognition is tackled by multi-layer feedforward neural networks with different types of neuronal activation functions. Three types of activation functions are adopted in the network, namely, the traditional sigmoid function, the sinusoidal function and a periodic function that can be considered as a combination of the first two functions. To speed up the learn...


Classification using Bayesian Neural Nets

Recently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to neural networks as suggested in the literature applied to classification problems is not easy. In fact we are not aware...


Colour texture analysis: A comparative study

In this paper we focus on classification of colour texture images. The main objective is to determine the contribution of colour information to classification performance. Three relevant approaches to greyscale texture analysis, namely Local Linear Transforms, Gabor filtering and Co-occurrence, are extended to colour images. They are evaluated in a quantitative manner by means of a comparative experimen...


Taming the Waves: Sine as Activation Function in Deep Neural Networks

Most deep neural networks use non-periodic and monotonic (or at least quasiconvex) activation functions. While sinusoidal activation functions have been successfully used for specific applications, they remain largely ignored and regarded as difficult to train. In this paper we formally characterize why these networks can indeed often be difficult to train even in very simple scenarios, and desc...




Journal title:

Volume   Issue

Pages  -

Publication date: 2008